Generalizing the optimized gradient method for smooth convex minimization

Abstract

The optimized gradient method (OGM) was recently developed by optimizing the step coefficients of first-order methods with respect to the function value. This OGM has a per-iteration computational cost similar to that of Nesterov's fast gradient method (FGM), yet satisfies a worst-case convergence bound on the function value that is half that of FGM. Moreover, OGM was recently shown to achieve the optimal cost-function complexity bound of first-order methods (with either fixed or dynamic step sizes) for smooth convex minimization. Considering that OGM is superior to the widely used FGM for smooth convex minimization with respect to the worst-case rate of function decrease, it is desirable to further understand the formulation and convergence analysis of OGM. Therefore, this paper studies a generalized formulation of OGM and its convergence analysis in terms of both the function value and the gradient. We then optimize the step coefficients of first-order methods in terms of the rate of decrease of the gradient norm. This analysis leads to a new algorithm, called OGM-OG, that has the best known analytical worst-case bound on the decrease of the gradient norm among fixed-step first-order methods.
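
For concreteness, below is a minimal, illustrative Python sketch of the fixed-step OGM iteration as published in Kim and Fessler's earlier OGM work, applied to a toy quadratic. The generalized formulation and the OGM-OG step coefficients introduced in this paper are not reproduced here; the test problem, variable names, and iteration count are assumptions made for illustration.

```python
import numpy as np

def ogm(grad, L, x0, N):
    """Fixed-step OGM (Kim & Fessler, 2016) for an L-smooth convex f.

    grad: callable returning the gradient of f
    L:    Lipschitz constant of the gradient
    x0:   starting point
    N:    number of iterations
    """
    x, y = x0.copy(), x0.copy()
    theta = 1.0
    for i in range(N):
        y_next = x - grad(x) / L  # plain gradient step
        if i < N - 1:
            theta_next = (1 + np.sqrt(1 + 4 * theta**2)) / 2
        else:
            # modified factor used only at the final iteration
            theta_next = (1 + np.sqrt(1 + 8 * theta**2)) / 2
        # Nesterov-style momentum plus an extra gradient-correction term
        x = (y_next
             + (theta - 1) / theta_next * (y_next - y)
             + theta / theta_next * (y_next - x))
        y, theta = y_next, theta_next
    return y

# Toy example (an assumption for illustration):
# minimize f(x) = 0.5 x'Ax - b'x, whose gradient is Ax - b.
A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
L = 100.0  # largest eigenvalue of A
x_star = np.linalg.solve(A, b)
x_N = ogm(lambda x: A @ x - b, L, np.zeros(3), N=50)
print(np.linalg.norm(x_N - x_star))
```

Compared with FGM, the only differences are the extra momentum term (theta / theta_next) * (y_next - x) and the modified final-iteration factor; these account for the factor-of-two improvement in the worst-case function-value bound mentioned above.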